Latency-avoiding Dynamic Optical Circuit Prefetching Using Application-specific Predictors

Authors

  • Ke Wen
  • Sébastien Rumley
  • Jeremiah Wilke
  • Keren Bergman
Abstract

Optical circuits are capable of providing large bandwidth improvements relative to electrically switched networks in high-performance computing (HPC) systems. However, due to the circuit-switching nature of optical systems, setup delays may prevent HPC systems from fully utilizing the available bandwidth. This paper proposes an application-guided circuit management technique that can achieve latency-avoiding dynamic reconfiguration, better leveraging the high-bandwidth photonics and accelerating system performance. By learning the temporal locality and communication patterns of upper-layer applications, the technique not only caches a set of circuits to maximize reuse, but also prefetches predicted circuits to actively hide the setup latency. We apply the technique to communication patterns from a spectrum of science and engineering applications. The results show that setup delays incurred by circuit misses are significantly reduced, showing how the proposed technique can improve circuit switching in HPC optical interconnects.
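As a minimal sketch of the idea described above, the following Python class combines an LRU cache of established circuits (for reuse) with a predictor-driven prefetch step (to hide setup latency). All class, method, and variable names are illustrative assumptions, not the paper's actual design or API.

```python
from collections import OrderedDict

class CircuitManager:
    """Sketch of application-guided circuit caching plus prefetching.

    Circuits to recently used destinations stay cached in LRU order so
    repeated communication reuses an established circuit; a predictor
    supplied by the application suggests the next destination so that
    circuit can be set up ahead of time, hiding the setup delay.
    Illustrative only -- not the paper's actual algorithm.
    """

    def __init__(self, capacity, predictor):
        self.capacity = capacity        # number of circuits kept alive
        self.circuits = OrderedDict()   # dst -> established circuit, LRU order
        self.predictor = predictor      # callable: dst -> predicted next dst
        self.hits = 0
        self.misses = 0

    def _establish(self, dst):
        """Set up a circuit to dst, evicting the least recently used one."""
        if dst in self.circuits:
            self.circuits.move_to_end(dst)
            return
        if len(self.circuits) >= self.capacity:
            self.circuits.popitem(last=False)   # tear down the LRU circuit
        self.circuits[dst] = f"circuit->{dst}"

    def send(self, dst):
        """Send to dst: a hit reuses a cached circuit; a miss pays setup delay."""
        if dst in self.circuits:
            self.hits += 1
            self.circuits.move_to_end(dst)
        else:
            self.misses += 1
            self._establish(dst)
        # Prefetch: proactively establish the circuit predicted to be used next.
        nxt = self.predictor(dst)
        if nxt is not None:
            self._establish(nxt)
```

For a sweep-style pattern where a message to rank d is typically followed by one to rank d+1, a predictor such as `lambda d: (d + 1) % 8` lets every send after the first find its circuit already established.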


Similar Papers

Using Dynamic Sets to Reduce the Aggregate Latency of Data Access

Many users of large distributed systems are plagued by high latency when accessing remote data. Latency is particularly problematic for the critical application of search and retrieval, which tends to access many objects and may suffer a long wait for each object accessed. Existing techniques like caching, inferential prefetching, and explicit prefetching are not suited to search, are ineffective...


Non-Sequential Instruction Cache Prefetching for Multiple-Issue Processors

This paper presents a novel instruction cache prefetching mechanism for multiple-issue processors. Such processors at high clock rates often have to use a small instruction cache which can have significant miss rates. Prefetching from secondary cache or even memory can hide the instruction cache miss penalties, but only if initiated sufficiently far ahead of the current program counter. Existin...


Library-based Prefetching for Pointer-intensive Applications

Processor speed has been improving faster than memory latency for over two decades. Thus, an increased portion of execution time is spent stalling on loads, waiting for data from the memory hierarchy. Prefetching is an effective mechanism to hide memory latency for applications with low temporal locality. However, existing hardware prefetching techniques work well for array-based programs but n...


Exploitation of Location-dependent Caching and Prefetching Techniques for Supporting Mobile Computing and Communications

Global mobility and connectivity of mobile computing and communication can be satisfied by integration of existing and future communication networks. Techniques that can cope with the varying bandwidth, multiple networks and the latency of mobile accesses are essential for supporting performance transparency of mobile computing and communication systems of the future. In this paper, we present ...


Web Prefetching with High Accuracy and Low Memory Cost

Prefetching algorithms can effectively reduce Web latency and dramatically improve responsiveness of interactive Web application. We propose a new history-based prefetching algorithm that achieves very high prediction accuracy, generates little overhead traffic, and allows users to bound the amount of memory that it uses. We also propose a method to find accurate upper bounds on the performance...
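A history-based Web prefetcher with bounded memory, as this teaser describes, can be sketched as a per-page successor table evicted in LRU order. This is a deliberately simplified assumption of the general idea, not the authors' actual algorithm; all names are hypothetical.

```python
from collections import Counter, OrderedDict

class HistoryPrefetcher:
    """Toy history-based prefetcher with a memory bound.

    Records, for each page, how often each successor page follows it,
    and predicts the most frequent successor. The table is kept in LRU
    order and capped at max_entries, bounding memory use.
    Illustrative sketch only.
    """

    def __init__(self, max_entries):
        self.max_entries = max_entries
        self.table = OrderedDict()   # page -> Counter of observed successors
        self.prev = None

    def record(self, page):
        """Observe a request and update the successor counts."""
        if self.prev is not None:
            if self.prev not in self.table:
                if len(self.table) >= self.max_entries:
                    self.table.popitem(last=False)   # evict oldest entry
                self.table[self.prev] = Counter()
            else:
                self.table.move_to_end(self.prev)
            self.table[self.prev][page] += 1
        self.prev = page

    def predict(self, page):
        """Return the most frequently observed successor of page, or None."""
        succ = self.table.get(page)
        return succ.most_common(1)[0][0] if succ else None
```

A predicted page can then be fetched into the cache before the user requests it; the `max_entries` cap is what lets users bound the predictor's memory cost.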



Journal title:

Volume   Issue

Pages  -

Publication year: 2015